Comparing Word Representations for Implicit Discourse Relation Classification
Authors
Abstract
This paper presents a detailed comparative framework for assessing the usefulness of unsupervised word representations for identifying so-called implicit discourse relations. Specifically, we compare standard one-hot word pair representations against low-dimensional ones based on Brown clusters and word embeddings. We also consider various word vector combination schemes for deriving discourse segment representations from word vectors, and compare representations based either on all words or limited to head words. Our main finding is that denser representations systematically outperform sparser ones and achieve performance at or above the state of the art without the need for additional hand-crafted features.
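To make the combination schemes concrete, the following is a minimal sketch, assuming the embeddings are a plain token-to-vector lookup and using mean/sum/max pooling as illustrative choices (the function names and defaults are ours, not the authors' exact setup): each discourse argument is pooled into a single segment vector, and the argument pair is represented by concatenating the two segment vectors before classification.

```python
import numpy as np

def segment_vector(tokens, embeddings, dim=300, combine="mean"):
    """Pool the word vectors of one discourse argument into a single vector.

    `embeddings` is assumed to be a token -> np.ndarray lookup (e.g. loaded
    from pre-trained vectors); `combine` selects one simple pooling scheme.
    """
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    if not vecs:
        return np.zeros(dim)
    stacked = np.stack(vecs)
    if combine == "sum":
        return stacked.sum(axis=0)
    if combine == "max":
        return stacked.max(axis=0)
    return stacked.mean(axis=0)

def pair_features(arg1_tokens, arg2_tokens, embeddings, dim=300, combine="mean"):
    """Represent an (Arg1, Arg2) pair by concatenating its two segment vectors."""
    v1 = segment_vector(arg1_tokens, embeddings, dim, combine)
    v2 = segment_vector(arg2_tokens, embeddings, dim, combine)
    return np.concatenate([v1, v2])
```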
Similar resources
Learning Connective-based Word Representations for Implicit Discourse Relation Identification
We introduce a simple semi-supervised approach to improve implicit discourse relation identification. This approach harnesses large amounts of automatically extracted discourse connectives along with their arguments to construct new distributional word representations. Specifically, we represent words in the space of discourse connectives as a way to directly encode their rhetorical function. E...
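As a rough illustration of this idea (the function below and its row-normalisation are assumptions made for the sketch, not the paper's exact construction), a word can be represented by its co-occurrence counts with a fixed inventory of discourse connectives, collected from automatically extracted explicit examples:

```python
from collections import Counter, defaultdict

def connective_space_vectors(instances, connectives):
    """Build word vectors whose dimensions are discourse connectives.

    `instances` is an iterable of (connective, argument_tokens) pairs
    extracted from explicit discourse examples; each word is represented
    by how often it co-occurs with each connective.
    """
    index = {c: i for i, c in enumerate(connectives)}
    counts = defaultdict(Counter)
    for connective, tokens in instances:
        if connective not in index:
            continue
        for tok in tokens:
            counts[tok][connective] += 1
    # Row-normalise so every word vector sums to 1 (a simple choice;
    # PPMI or other reweightings would be equally plausible here).
    vectors = {}
    for tok, cnt in counts.items():
        total = sum(cnt.values())
        vec = [0.0] * len(index)
        for c, n in cnt.items():
            vec[index[c]] = n / total
        vectors[tok] = vec
    return vectors
```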
Implicit Discourse Relation Recognition with Context-aware Character-enhanced Embeddings
For the task of implicit discourse relation recognition, traditional models utilizing manual features can suffer from the data sparsity problem. Neural models provide a solution with distributed representations, which can encode latent semantic information and are suitable for recognizing semantic relations between argument pairs. However, conventional vector representations usually adopt em...
Discourse Relation Recognition by Comparing Various Units of Sentence Expression with Recursive Neural Network
We propose a method for implicit discourse relation recognition using a recursive neural network (RNN). Many previous studies have used the word-pair feature to compare the meaning of two sentences for implicit discourse relation recognition. Our proposed method differs in that we use various-sized sentence expression units and compare the meaning of the expressions between two sentences by con...
Improving Implicit Discourse Relation Recognition with Discourse-specific Word Embeddings
We introduce a simple and effective method to learn discourse-specific word embeddings (DSWE) for implicit discourse relation recognition. Specifically, DSWE is learned by performing connective classification on massive explicit discourse data, and is capable of capturing discourse relationships between words. On the PDTB data set, using DSWE as features achieves significant improvements over base...
A Latent Variable Recurrent Neural Network for Discourse-Driven Language Models
This paper presents a novel latent variable recurrent neural network architecture for jointly modeling sequences of words and (possibly latent) discourse relations between adjacent sentences. A recurrent neural network generates individual words, thus reaping the benefits of discriminatively-trained vector representations. The discourse relations are represented with a latent variable, which ca...
Publication year: 2015